
    External localization system for mobile robotics

    We present a fast and precise vision-based software system intended for multiple-robot localization. The core component of the proposed localization system is an efficient method for detecting black-and-white circular patterns. The method is robust to variable lighting conditions, achieves sub-pixel precision, and its computational complexity is independent of the processed image size. With off-the-shelf computational equipment and a low-cost camera, the core algorithm can process hundreds of images per second while tracking hundreds of objects with millimeter precision. We propose a mathematical model of the method that allows one to calculate its precision, area of coverage, and processing speed from the camera's intrinsic parameters and the hardware's processing capacity. The correctness of the presented model and the performance of the algorithm in real-world conditions are verified in several experiments. Apart from the method description, we also publish its source code, so it can be used as an enabling technology for various mobile robotics problems.
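
    As a rough illustration of how such a model can relate camera intrinsics to range, precision and coverage, the sketch below uses a plain pinhole projection. The focal length, the minimum detectable marker size in pixels and the 0.1 px centroid accuracy are assumed values for illustration, not figures or formulas taken from the paper.

```python
import math

def marker_range_and_precision(focal_px, marker_diameter_m, min_marker_px=10,
                               horizontal_fov_deg=60.0):
    """Illustrative pinhole-model estimates (not the paper's exact model)."""
    # Pinhole projection: apparent diameter d_px = focal_px * D / z,
    # so the marker stays detectable up to z = focal_px * D / d_min.
    max_range_m = focal_px * marker_diameter_m / min_marker_px
    # One pixel at distance z spans z / focal_px metres laterally;
    # assume sub-pixel centroid estimation good to roughly 0.1 px.
    precision_m = 0.1 * max_range_m / focal_px
    # Approximate covered width at maximum range from the horizontal FOV.
    coverage_width_m = 2 * max_range_m * math.tan(math.radians(horizontal_fov_deg) / 2)
    return max_range_m, precision_m, coverage_width_m

# Example: 700 px focal length, 5 cm marker.
print(marker_range_and_precision(focal_px=700, marker_diameter_m=0.05))
```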

    Image features and seasons revisited

    We present an evaluation of standard image features in the context of long-term visual teach-and-repeat mobile robot navigation, where the environment exhibits significant changes in appearance caused by seasonal weather variations and daily illumination changes. We argue that in the given long-term scenario, the viewpoint, scale and rotation invariance of the standard feature extractors is less important than their robustness to mid- and long-term changes in environment appearance. Therefore, we focus our evaluation on the robustness of image registration to variable lighting and naturally occurring seasonal changes. We evaluate the image feature extractors on three datasets collected by mobile robots in two different outdoor environments over the course of one year. Based on this analysis, we propose a novel feature descriptor based on a combination of evolutionary algorithms and Binary Robust Independent Elementary Features, which we call GRIEF (Generated BRIEF). In terms of robustness to seasonal changes, the GRIEF feature descriptor outperforms the other evaluated features while being computationally more efficient.
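
    The central idea of GRIEF, evolving the pixel-comparison pairs of a BRIEF-style binary descriptor, can be sketched as follows. The patch size, the number of tests and the mutation scheme are illustrative assumptions, and the fitness evaluation over seasonal training imagery that drives the evolution is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
PATCH = 31  # patch side length in pixels (assumed)

def random_pairs(n_tests=256):
    """BRIEF-style comparisons: each row holds (x1, y1, x2, y2) inside the patch."""
    return rng.integers(0, PATCH, size=(n_tests, 4))

def describe(patch_img, pairs):
    """Binary descriptor: bit is 1 where intensity at point 1 < intensity at point 2."""
    x1, y1, x2, y2 = pairs.T
    return (patch_img[y1, x1] < patch_img[y2, x2]).astype(np.uint8)

def hamming(d1, d2):
    return int(np.count_nonzero(d1 != d2))

def mutate(pairs, n_mutations=8):
    """Evolutionary step: swap a few comparison pairs for new random ones.
    A real training loop would keep the mutation only if it improves matching
    across images taken in different seasons."""
    out = pairs.copy()
    idx = rng.choice(len(pairs), size=n_mutations, replace=False)
    out[idx] = rng.integers(0, PATCH, size=(n_mutations, 4))
    return out

# Toy usage: the same patch under a mild appearance change.
pairs = random_pairs()
a = rng.integers(0, 256, size=(PATCH, PATCH))
b = a + rng.integers(-10, 10, size=(PATCH, PATCH))
print(hamming(describe(a, pairs), describe(b, pairs)))
pairs = mutate(pairs)  # one evolutionary step
```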

    WhyCon: an efficient, marker-based localization system

    We present an open-source marker-based localization system intended as a low-cost, easy-to-deploy solution for aerial and swarm robotics. The main advantage of the presented method is its high computational efficiency, which allows its deployment on small robots with limited computational resources. Even on low-end computers, the core component of the system can detect and estimate the 3D positions of hundreds of black-and-white markers at the maximum frame rate of standard cameras. The method is robust to changing lighting conditions and achieves accuracy on the order of millimeters to centimeters. Due to its reliability, simplicity of use and availability as an open-source ROS module (http://purl.org/robotics/whycon), the system is now used in a number of aerial robotics projects where fast and precise relative localization is required.
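
    A much simplified sketch of how a detected marker's image position and apparent size can be turned into a 3D position under a pinhole model follows. The real system fits an ellipse to the marker and accounts for its orientation; the intrinsics and marker diameter used here are hypothetical.

```python
import numpy as np

def marker_position_3d(u, v, apparent_diameter_px, fx, fy, cx, cy,
                       marker_diameter_m=0.1):
    """Back-project a detected circular marker to a 3D position in the camera frame.
    Simplified pinhole sketch; not the system's actual ellipse-based estimator."""
    # Depth from apparent size: d_px ~ fx * D / z  =>  z ~ fx * D / d_px.
    z = fx * marker_diameter_m / apparent_diameter_px
    # Lateral position from the pixel offset relative to the principal point.
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.array([x, y, z])

# Hypothetical detection: marker centred at pixel (400, 260), 35 px across.
print(marker_position_3d(u=400, v=260, apparent_diameter_px=35,
                         fx=700, fy=700, cx=320, cy=240))
```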

    GPU accelerated implementation of density functional theory for hybrid QM/MM simulations

    Hybrid simulation tools (QM/MM) have evolved into a fundamental methodology for studying chemical reactivity in complex environments. This paper presents an implementation of electronic structure calculations based on density functional theory, optimized for hybrid molecular dynamics simulations by using graphics processors (GPUs) for the most computationally demanding parts (the exchange-correlation terms). The proposed implementation takes advantage of modern GPUs, accelerating the relevant portions by a factor of 20 to 30 over the CPU version. The presented code was extensively tested, both in terms of numerical quality and performance, over systems of different size and composition.
    Authors: Nitsche, Matias Alejandro (Universidad de Buenos Aires, Departamento de Computación / CONICET, Argentina); Ferreria, Manuel (CONICET, Centro de Simulación Computacional para Aplicaciones Tecnológicas, Argentina); Mocskos, Esteban Eduardo (Universidad de Buenos Aires, Departamento de Computación / CONICET, Argentina); González Lebrero, Mariano Camilo (CONICET-UBA, Instituto de Química y Físico-Química Biológicas "Prof. Alejandro C. Paladini", Argentina).
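
    To give a flavour of the per-grid-point work that is offloaded to the GPU, the sketch below evaluates a Slater (LDA) exchange term over a numerical integration grid with NumPy. The paper's code implements full exchange-correlation functionals on the GPU, with the per-point evaluation parallelized across threads; the grid and densities here are synthetic.

```python
import numpy as np

def lda_exchange_energy(density, weights):
    """Slater/LDA exchange energy integrated over a numerical grid:
    E_x = sum_i w_i * rho_i * eps_x(rho_i), eps_x = -(3/4) * (3/pi)^(1/3) * rho^(1/3).
    Each grid point is independent, which is what makes GPU parallelization natural."""
    c_x = -(3.0 / 4.0) * (3.0 / np.pi) ** (1.0 / 3.0)
    eps_x = c_x * np.cbrt(density)  # exchange energy per electron at each point
    return float(np.sum(weights * density * eps_x))

# Synthetic grid: random densities with uniform quadrature weights.
rng = np.random.default_rng(1)
rho = rng.random(10_000)
w = np.full_like(rho, 1e-3)
print(lda_exchange_energy(rho, w))
```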

    Simplifying UAV-Based Photogrammetry in Forestry: How to Generate Accurate Digital Terrain Model and Assess Flight Mission Settings

    In forestry, aerial photogrammetry by means of Unmanned Aerial Systems (UAS) could bridge the gap between detailed fieldwork and broad-range satellite-imagery-based analysis. However, optical sensors penetrate the tree canopy only poorly, so raw image-based point clouds cannot reliably capture and classify ground points in woodlands, which is essential for further data processing. In this work, we propose a novel method to overcome this issue and generate an accurate Digital Terrain Model (DTM) in forested environments by processing the point cloud. We also developed a highly realistic custom simulator that allows controlled experimentation with guaranteed repeatability. With this tool, we performed an exhaustive evaluation of the survey and sensor settings and their impact on the 3D reconstruction. Overall, we found that a high frontal overlap (95%), a nadir camera angle (90°) and low flight altitudes (less than 100 m) form the best configuration for forest environments. We validated the presented method for DTM generation in simulated and real-world survey missions with both fixed-wing and multicopter UAS, showing how the estimation of structural forest parameters can be better addressed. Finally, we applied our method to the automatic detection of selective logging.
    Authors: Pessacg, Facundo Hugo; Gómez Fernández, Francisco Roberto; Nitsche, Matias Alejandro; Chamorro, Nicolás (CONICET-UBA, Instituto de Investigación en Ciencias de la Computación, Argentina); Torrella, Sebastián Andrés; Ginzburg, Rubén Gabriel (CONICET-UBA, Instituto de Ecología, Genética y Evolución de Buenos Aires, Argentina); de Cristóforis, Pablo (CONICET-UBA, Instituto de Investigación en Ciencias de la Computación, Argentina).
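
    As a minimal stand-in for DTM generation from a photogrammetric point cloud, the sketch below keeps the lowest return in each ground cell. The paper's method goes further, classifying and interpolating ground points under canopy; the cell size and the toy cloud are assumptions.

```python
import numpy as np

def simple_dtm(points, cell_size=1.0):
    """Grid-minimum ground filter: keep the lowest point per XY cell as the
    terrain estimate. A simplified stand-in for the paper's DTM pipeline."""
    cells = np.floor(points[:, :2] / cell_size).astype(int)
    dtm = {}
    for cell, z in zip(map(tuple, cells), points[:, 2]):
        if cell not in dtm or z < dtm[cell]:
            dtm[cell] = z
    return dtm  # {(cell_x, cell_y): ground elevation in metres}

# Toy cloud: a flat ground plane plus scattered canopy returns 5-25 m above it.
rng = np.random.default_rng(2)
ground = np.column_stack([rng.uniform(0, 50, 5000), rng.uniform(0, 50, 5000),
                          rng.normal(0.0, 0.05, 5000)])
canopy = np.column_stack([rng.uniform(0, 50, 2000), rng.uniform(0, 50, 2000),
                          rng.uniform(5.0, 25.0, 2000)])
print(len(simple_dtm(np.vstack([ground, canopy]))))
```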

    Severe forms of partial androgen insensitivity syndrome due to p.L830F novel mutation in androgen receptor gene in a Brazilian family

    Background: The androgen insensitivity syndrome may cause developmental failure of normal male external genitalia in individuals with a 46,XY karyotype. It results from the diminished or absent biological action of androgens, which is mediated by the androgen receptor in both embryonic and secondary sexual development. Mutations in the androgen receptor gene, located on the X chromosome, are responsible for the disease. Almost 70% of affected 46,XY individuals inherit mutations from their carrier mothers. Findings: Molecular abnormalities in the androgen receptor gene were evaluated in individuals of a Brazilian family with clinical features of severe forms of partial androgen insensitivity syndrome. Seven members of the family (five 46,XY females and two healthy mothers) were included in the investigation. The coding exons and exon-intron junctions of the androgen receptor gene were sequenced. Five 46,XY members of the family were found to be hemizygous for the c.3015C>T nucleotide change in exon 7 of the androgen receptor gene, whereas the two 46,XX mothers were heterozygous carriers. This nucleotide substitution leads to the p.L830F mutation in the androgen receptor. Conclusions: The novel p.L830F mutation is responsible for grades 5 and 6 of partial androgen insensitivity syndrome in two generations of a Brazilian family.

    Visual-inertial teach and repeat

    Teach and Repeat (T&R) refers to technology that allows a robot to autonomously follow a previously traversed route in a natural scene using only its onboard sensors. In this paper we present a Visual-Inertial Teach and Repeat (VI-T&R) algorithm that uses stereo and inertial data and targets Unmanned Aerial Vehicles with limited on-board computational resources. We propose a tightly coupled, relative formulation of the visual-inertial constraints that is tailored to the T&R application. To achieve real-time operation on limited hardware, we reduce the problem to motion-only visual-inertial bundle adjustment. For the repeat stage, we detail how to generate a trajectory and follow it smoothly with a constantly changing relative frame. The proposed method is validated in simulated environments, using real sensor data from the public EuRoC dataset, and with our own robotic setup under closed-loop control. Our experimental results demonstrate high accuracy and real-time performance both on a standard desktop system and on a low-cost Odroid XU4 embedded computer.
    Authors: Nitsche, Matias Alejandro; Pessacg, Facundo Hugo (CONICET-UBA, Instituto de Investigación en Ciencias de la Computación, Argentina); Civera Sancho, Javier (Universidad de Zaragoza, Spain).
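
    The repeat-stage following step can be illustrated with a toy controller that steers toward the next taught waypoint expressed in the vehicle's current body frame. This is not the paper's trajectory generation or control law; the gains and the waypoint below are arbitrary.

```python
import numpy as np

def follow_command(waypoint_body, v_max=1.0, k_yaw=1.5):
    """Toy repeat-stage follower: given the next taught waypoint in the current
    body frame, command forward speed toward it and yaw toward its bearing."""
    x, y = waypoint_body[:2]
    dist = float(np.hypot(x, y))
    bearing = float(np.arctan2(y, x))
    forward = min(v_max, dist) * max(0.0, np.cos(bearing))  # slow down when misaligned
    yaw_rate = k_yaw * bearing
    return forward, yaw_rate

# Waypoint 2 m ahead and 0.5 m to the left in the body frame.
print(follow_command(np.array([2.0, 0.5])))
```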

    Real-time monocular image-based path detection

    In this work, we present a new real-time image-based monocular path detection method. It does not require camera calibration and works on semi-structured outdoor paths. The core of the method segments the image and classifies each super-pixel to infer a contour of the navigable space. This allows a mobile robot equipped with a monocular camera to follow different naturally delimited paths. The contour shape can be used to calculate the forward and steering speed of the robot. To achieve the real-time computation necessary for on-board execution on mobile robots, the image segmentation is implemented on a low-power embedded GPU. The validity of our approach has been verified with an image dataset of various outdoor paths as well as with a real mobile robot.
    Authors: de Cristóforis, Pablo; Nitsche, Matias Alejandro (Universidad de Buenos Aires, Departamento de Computación / CONICET, Argentina); Krajník, Tomáš (Czech Technical University in Prague, Czech Republic); Mejail, Marta Estela (Universidad de Buenos Aires, Departamento de Computación, Argentina).
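
    A sketch of how a navigable-space mask can be turned into forward and steering speeds is given below. The super-pixel segmentation and classification that produce the mask are omitted, and the proportional control and gains are illustrative assumptions rather than the paper's controller.

```python
import numpy as np

def speeds_from_path_mask(path_mask, v_max=0.5, k_steer=1.0):
    """Toy mapping from a binary navigable-space mask to motion commands:
    steer proportionally to the horizontal offset of the path centroid and
    reduce forward speed as the offset grows."""
    ys, xs = np.nonzero(path_mask)
    if xs.size == 0:
        return 0.0, 0.0  # no navigable space detected: stop
    h, w = path_mask.shape
    offset = (xs.mean() - w / 2.0) / (w / 2.0)  # -1 (far left) .. +1 (far right)
    steering = -k_steer * offset
    forward = v_max * (1.0 - abs(offset))
    return forward, steering

# Fake path region occupying the lower-centre part of a 160x120 image.
mask = np.zeros((120, 160), dtype=bool)
mask[60:, 50:110] = True
print(speeds_from_path_mask(mask))
```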

    Efficient on-board Stereo SLAM through constrained-covisibility strategies

    Visual SLAM is a computationally expensive task, with a complexity that grows unbounded as the size of the explored area increases. This becomes an issue when targeting embedded applications such as on-board localization on Micro Aerial Vehicles (MAVs), where real-time execution is mandatory and computational resources are a limiting factor. The proposed method introduces a covisibility-graph-based map representation that allows a visual SLAM system to execute with a complexity that does not depend on the size of the map. The proposed structure efficiently selects locally relevant portions of the map to be optimized, in such a way that the results resemble performing a full optimization over the whole trajectory. We build on S-PTAM (Stereo Parallel Tracking and Mapping), yielding an accurate and robust stereo SLAM system capable of working in real time under limited hardware constraints such as those present on MAVs. The developed SLAM system is assessed using the EuRoC dataset. Results show that covisibility-graph-based map culling allows the SLAM system to run in real time even on a low-resource embedded computer. The impact of each SLAM task on the overall system performance is analyzed in detail, and the SLAM system is compared with state-of-the-art methods to validate the presented approach.
    Authors: Castro, Gastón Ignacio; Nitsche, Matias Alejandro; Pire, Taihú Aguará Nahuel; Fischer, Thomas; de Cristóforis, Pablo (CONICET-UBA, Instituto de Investigación en Ciencias de la Computación, Argentina).
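
    The constrained-covisibility idea, keeping optimization cost bounded by selecting only locally relevant keyframes, can be sketched as a budgeted breadth-first expansion over the covisibility graph. The graph representation and the budget below are toy assumptions, not S-PTAM's actual data structures.

```python
from collections import deque

def local_map(covisibility, current_kf, max_keyframes=20):
    """Select keyframes to optimize by breadth-first expansion over the
    covisibility graph, stopping at a fixed budget so that the optimization
    cost does not grow with total map size."""
    selected, frontier = {current_kf}, deque([current_kf])
    while frontier and len(selected) < max_keyframes:
        kf = frontier.popleft()
        for neighbour in covisibility.get(kf, ()):
            if neighbour not in selected:
                selected.add(neighbour)
                frontier.append(neighbour)
                if len(selected) >= max_keyframes:
                    break
    return selected

# Toy covisibility graph: keyframe id -> covisible keyframe ids.
graph = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 4], 3: [1], 4: [2, 5], 5: [4]}
print(local_map(graph, current_kf=4, max_keyframes=4))
```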